
    Monitoring different phonological parameters of sign language engages the same cortical language network but distinctive perceptual ones

    The study of signed languages allows the dissociation of sensorimotor and cognitive neural components of the language signal. Here we investigated the neurocognitive processes underlying the monitoring of two phonological parameters of sign languages: handshape and location. Our goal was to determine whether brain regions processing sensorimotor characteristics of different phonological parameters of sign languages were also involved in phonological processing, with their activity being modulated by the linguistic content of manual actions. We conducted an fMRI experiment using manual actions varying in phonological structure and semantics: (1) signs of a familiar sign language (British Sign Language), (2) signs of an unfamiliar sign language (Swedish Sign Language), and (3) invented nonsigns that violate the phonological rules of British Sign Language and Swedish Sign Language or consist of nonoccurring combinations of phonological parameters. Three groups of participants were tested: deaf native signers, deaf nonsigners, and hearing nonsigners. Results show that the linguistic processing of different phonological parameters of sign language is independent of the sensorimotor characteristics of the language signal. Handshape and location were processed by different perceptual and task-related brain networks but recruited the same language areas. The semantic content of the stimuli did not influence this process, but phonological structure did, with nonsigns being associated with longer reaction times (RTs) and stronger activations in an action observation network in all participants, and in the supramarginal gyrus exclusively in deaf signers. These results suggest higher processing demands for stimuli that contravene the phonological rules of a signed language, independently of previous knowledge of signed languages. We suggest that the phonological characteristics of a language may arise as a consequence of more efficient neural processing for its perception and production.

    Differential activity in Heschl's gyrus between deaf and hearing individuals is due to auditory deprivation rather than language modality

    Sensory cortices undergo crossmodal reorganisation as a consequence of sensory deprivation. Congenital deafness in humans represents a particular case relative to other types of sensory deprivation, because cortical reorganisation is not only a consequence of auditory deprivation, but also of language-driven mechanisms. Visual crossmodal plasticity has been found in secondary auditory cortices of deaf individuals, but it is still unclear whether reorganisation also takes place in primary auditory areas, and how this relates to language modality and auditory deprivation. Here, we dissociated the effects of language modality and auditory deprivation on crossmodal plasticity in Heschl's gyrus as a whole, and in cytoarchitectonic region Te1.0 (likely to contain the core auditory cortex). Using fMRI, we measured the BOLD response to viewing sign language in congenitally or early deaf individuals with and without sign language knowledge, and in hearing controls. Results show that differences between hearing and deaf individuals are due to a reduction in activation caused by visual stimulation in the hearing group, which is more significant in Te1.0 than in Heschl's gyrus as a whole. Furthermore, differences between deaf and hearing groups are due to auditory deprivation, and there is no evidence that the modality of language used by deaf individuals contributes to crossmodal plasticity in Heschl's gyrus.

    Language switching and the effects of orthographic specificity and response repetition

    In two experiments, Greek-English bilinguals alternated between performing a lexical decision task in Greek and in English. The cost to performance on switch trials interacted with response repetition, implying that a source of this “switch cost” lies at the level of response mapping or initiation. Orthographic specificity also affected switch cost. Greek and English have partially overlapping alphabets, which enabled us to manipulate language specificity at the letter level, rather than only at the level of letter clusters. Language-nonspecific stimuli used only symbols common to both Greek and English, whereas language-specific stimuli contained letters unique to just one language. The switch cost was markedly reduced by such language-specific orthography, and this effect did not interact with the effect of response repetition, implying a separate, stimulus-sensitive source of switch costs. However, we argue that this second source lies not within the word-recognition system, but at the level of task schemas, because the reduction of switch cost with language-specific stimuli was abolished when these stimuli were intermingled with language-nonspecific stimuli.

    Asymmetric Morphological Priming Among Inflected and Derived Verbs and Nouns in Greek

    The present study examined differences between inflectional and derivational morphology using Greek nouns and verbs with masked priming (with both short and long stimulus onset asynchrony) and long-lag priming. A lexical decision task with inflected noun and verb targets was used to test whether their processing is differentially facilitated by prior presentation of their stem in words of the same grammatical class (inflectional morphology) or of a different grammatical class (derivational morphology). Differences in semantics, syntactic information, and morphological complexity between inflected and derived word pairs (both nouns and verbs) were minimized by unusually tight control of stimuli as permitted by Greek morphology. Results showed that morphological relations affected processing of morphologically complex Greek words (nouns and verbs) across prime durations (50–250 ms) as well as when items intervened between primes and targets. In two of the four experiments (Experiments 1 and 3), inflectionally related primes produced significantly greater effects than derivationally related primes, suggesting differences in processing inflectional versus derivational morphological relations, which may disappear when processing is less dependent on semantic effects (Experiment 4). Priming effects differed for verb vs. noun targets with long SOA priming (Experiment 3), consistent with processing differences between complex words of different grammatical class (nouns and verbs) when semantic effects are maximized. Taken together, the results demonstrate that inflectional and derivational relations differentially affect the processing of complex words of different grammatical class (nouns and verbs). This finding indicates that distinctions of morphological relation (inflectional vs. derivational) are not of the same kind as distinctions of grammatical class (nouns vs. verbs). Asymmetric differences among inflected and derived verbs and nouns seem to depend on semantic effects and/or processing demands modulating priming effects very early in lexical processing of morphologically complex written words, consistent with models of lexical processing positing early access to morphological structure and early influence of semantics.

    Experiment on Verification of a Planetary Rover Controller

    In this paper, we report an experiment on the verification of the K9 Rover Executive, an experimental platform for autonomous vehicles targeted for the exploration of the Martian surface, developed at NASA Ames. The Executive provides a means to control and command the vehicles through predefined plans, which are hierarchical descriptions of actions annotated with real-time constraints. The verification concerns the correctness of the Executive, which must execute the plans according to their semantics.
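    The abstract does not specify the K9 plan language; purely as an illustrative sketch (all names and structure here are assumptions, not the actual NASA Ames format), a hierarchical plan whose actions carry real-time constraints might be modelled as a tree of timed nodes, with a well-formedness check that each sub-action's time window nests inside its parent's:

    ```python
    from dataclasses import dataclass, field
    from typing import List

    # Hypothetical toy model: each node is an action with an allowed
    # execution window [earliest, latest]; children must fit inside
    # the parent's window. Illustrative only, not the K9 plan format.
    @dataclass
    class PlanNode:
        name: str
        earliest: float  # earliest allowed start time
        latest: float    # latest allowed finish time
        children: List["PlanNode"] = field(default_factory=list)

    def well_formed(node: PlanNode) -> bool:
        """Check recursively that every child's window nests in its parent's."""
        return all(
            node.earliest <= c.earliest and c.latest <= node.latest
            and well_formed(c)
            for c in node.children
        )

    plan = PlanNode("traverse", 0, 100, [
        PlanNode("drive_to_rock", 0, 60),
        PlanNode("take_image", 60, 90),
    ])
    print(well_formed(plan))  # True: both sub-actions fit the parent's window
    ```

    Checks of this kind concern only the plans themselves; the verification reported in the paper targets the Executive that interprets such plans.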